Automatic discrimination between laughter and speech
Authors
Abstract
Emotions can be recognized from audible paralinguistic cues in speech. By detecting these paralinguistic cues, which include laughter, a trembling voice, coughs, changes in the intonation contour, and so on, information about the speaker's state and emotion can be revealed. This paper describes the development of a gender-independent laugh detector with the aim of enabling automatic emotion recognit...
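As a rough illustration of how such a detector can be built, the sketch below trains one Gaussian mixture per class on frame-level spectral features and decides by log-likelihood ratio. It is a minimal sketch, not the paper's implementation: the file names are placeholders, and MFCCs stand in for whatever feature set the detector actually uses.

```python
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def frame_features(path, sr=16000, n_mfcc=13):
    """Per-frame MFCC vectors for one audio file."""
    y, sr = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.T  # shape: (n_frames, n_mfcc)

# One GMM per class, trained on pooled frames from labeled clips.
# Pooling frames from speakers of both genders is what keeps the
# model gender-independent. Paths below are placeholders.
laugh = np.vstack([frame_features(p) for p in ["laugh_001.wav"]])
speech = np.vstack([frame_features(p) for p in ["speech_001.wav"]])
gmm_laugh = GaussianMixture(n_components=8, covariance_type="diag").fit(laugh)
gmm_speech = GaussianMixture(n_components=8, covariance_type="diag").fit(speech)

def classify(path):
    """Mean per-frame log-likelihood ratio decides laughter vs. speech."""
    X = frame_features(path)
    return "laughter" if gmm_laugh.score(X) > gmm_speech.score(X) else "speech"
```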
Similar Resources

Automatic Laughter Segmentation
Our goal in this work was to develop an accurate method for identifying laughter segments, ultimately for the purpose of speaker recognition. Our previous work used MLPs to perform frame-level detection of laughter using short-term features, including MFCCs and pitch, and achieved a 7.9% EER on the ICSI Meeting Recorder vocalized test set. We improved upon our previous results by including high-lev...
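A hedged sketch of that frame-level setup, using scikit-learn's MLP in place of the authors' network: X holds per-frame MFCC-plus-pitch features and y holds binary laughter labels, both assumed given. The EER computation mirrors the metric quoted above; the 7.9% figure comes from the paper, not from this code.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_curve

def train_and_eer(X_train, y_train, X_test, y_test):
    """Train a frame-level laughter MLP and report its equal error rate."""
    mlp = MLPClassifier(hidden_layer_sizes=(100,), max_iter=300)
    mlp.fit(X_train, y_train)
    scores = mlp.predict_proba(X_test)[:, 1]  # P(laughter) per frame
    fpr, tpr, _ = roc_curve(y_test, scores)
    fnr = 1 - tpr
    # EER: the operating point where false alarms equal misses.
    eer = fpr[np.nanargmin(np.abs(fnr - fpr))]
    return mlp, eer
```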
New Feature For Automatic Speech/Music Discrimination
This paper presents a new feature for automatic speech/music discrimination (SMD), based on the concept of multiple fundamental frequencies. The performance of the feature in terms of correct classifications is evaluated for a wide variety of audio signals, and factors such as computational complexity and robustness are also investigated. The results are compared to those...
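The abstract does not spell out how the multiple-fundamental-frequency feature is computed; the sketch below is only one plausible reading: count strong periodicity peaks in a frame's autocorrelation, on the reasoning that polyphonic music exhibits several competing fundamentals while a speech frame usually has at most one.

```python
import numpy as np
from scipy.signal import find_peaks

def f0_candidate_count(frame, sr=16000, fmin=60.0, fmax=1000.0, rel=0.4):
    """Count autocorrelation peaks above rel * max within the pitch-lag range."""
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # lags >= 0
    lo, hi = int(sr / fmax), int(sr / fmin)
    seg = ac[lo:hi]
    if seg.size == 0 or seg.max() <= 0:
        return 0
    peaks, _ = find_peaks(seg, height=rel * seg.max())
    return len(peaks)  # >1 suggests multiple fundamentals, i.e. music
```

Averaging this count over a window of frames gives a scalar feature that a downstream classifier can threshold.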
Automatic detection of laughter
In the context of detecting 'paralinguistic events', with the aim of making classification of the speaker's emotional state possible, a detector was developed for one of the most obvious such events, namely laughter. Gaussian Mixture Models were trained with Perceptual Linear Prediction features, pitch&energy, pitch&voicing, and modulation spectrum features to model laughter and speech....
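Of the feature sets listed, the modulation spectrum is the least self-explanatory. A hedged sketch of one common formulation follows; the bands and rates here are assumptions, not the paper's settings. It takes the FFT of each mel band's log-energy envelope and keeps the energy around the few-Hz repetition rate characteristic of the syllable train in laughter bouts.

```python
import numpy as np
import librosa

def modulation_energy(path, sr=16000, n_mels=20, hop=160):
    """Mean 2-8 Hz modulation-band energy of the mel log-energy envelopes."""
    y, _ = librosa.load(path, sr=sr)
    S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels, hop_length=hop)
    env = np.log(S + 1e-10)                # per-band log-energy envelopes
    rate = sr / hop                        # envelope sampling rate, here 100 Hz
    M = np.abs(np.fft.rfft(env, axis=1))   # modulation spectrum per mel band
    freqs = np.fft.rfftfreq(env.shape[1], d=1.0 / rate)
    band = (freqs >= 2.0) & (freqs <= 8.0)
    return float(M[:, band].mean())
```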
Journal
Journal title: Speech Communication
Year: 2007
ISSN: 0167-6393
DOI: 10.1016/j.specom.2007.01.001